Improved Oracle Complexity of Variance Reduced Methods for Nonsmooth Convex Stochastic Composition Optimization

Authors

  • Tianyi Lin
  • Chenyou Fan
  • Mengdi Wang
Abstract

We consider the nonsmooth convex composition optimization problem where the objective is a composition of two finite-sum functions, and analyze stochastic compositional variance reduced gradient (SCVRG) methods for it. SCVRG and its variants have recently drawn much attention given their edge over stochastic compositional gradient descent (SCGD); but the theoretical analysis exclusively assumes strong convexity of the objective, which excludes several important examples such as Lasso, logistic regression, principal component analysis and deep neural nets. In contrast, we prove non-asymptotic incremental first-order oracle (IFO) complexity of SCVRG or its novel variants for nonsmooth convex composition optimization and show that they are provably faster than SCGD and gradient descent. More specifically, our method achieves a total IFO complexity of O((m+n) log(1/ε) + 1/ε), which improves the O(1/ε) of SCGD and the O((m+n)/√ε) of accelerated gradient descent (AGD). Experimental results confirm that our methods outperform several existing methods, e.g., SCGD and AGD, on the sparse mean-variance optimization problem.
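The compositional structure in the abstract, an outer finite sum applied to an inner finite sum, can be made concrete with a small sketch. Below is a hedged, illustrative SVRG-style compositional gradient loop on a toy quadratic problem; the problem data, estimator form, step size, and epoch length are all invented for illustration and are not the paper's exact SCVRG scheme:

```python
import numpy as np

# Toy compositional problem: minimize F(x) = f(g(x)) with
#   g(x) = (1/n) * sum_i A_i @ x     (inner finite sum, R^d -> R^p)
#   f(y) = 0.5 * ||y - b||^2         (smooth outer function)
# All data below is synthetic; this is a sketch, not the paper's method.

rng = np.random.default_rng(0)
n, d, p = 20, 5, 3
A = rng.normal(size=(n, p, d))
b = rng.normal(size=p)

g_i = lambda x, i: A[i] @ x
g = lambda x: A.mean(axis=0) @ x
grad_f = lambda y: y - b          # gradient of the outer function
jac_g_i = lambda i: A[i]          # Jacobian of g_i (constant here)

def scvrg_sketch(x0, epochs=50, inner=n, lr=0.05):
    x = x0.copy()
    for _ in range(epochs):
        # snapshot: exact inner-function value and full gradient
        x_snap = x.copy()
        g_snap = g(x_snap)
        full_grad = A.mean(axis=0).T @ grad_f(g_snap)
        for _ in range(inner):
            i, j = rng.integers(n), rng.integers(n)
            # variance-reduced estimate of the inner value g(x)
            g_est = g_snap + g_i(x, i) - g_i(x_snap, i)
            # variance-reduced compositional gradient estimator
            v = (jac_g_i(j).T @ grad_f(g_est)
                 - jac_g_i(j).T @ grad_f(g_snap)
                 + full_grad)
            x -= lr * v
    return x

x_star = scvrg_sketch(np.zeros(d))
print(0.5 * np.sum((g(x_star) - b) ** 2))  # final objective value
```

Note the key property of the estimator: at the snapshot point it coincides with the exact full gradient, which is what drives the improved oracle complexity relative to plain SCGD.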

Similar articles

First-Order Methods for Nonsmooth Convex Large-Scale Optimization, I: General Purpose Methods

We discuss several state-of-the-art, computationally cheap (as opposed to polynomial-time interior-point algorithms) first-order methods for minimizing convex objectives over simple large-scale feasible sets. Our emphasis is on the general situation of a nonsmooth convex objective represented by a deterministic/stochastic first-order oracle, and on methods which, under favorable circumstanc...


Accelerated Method for Stochastic Composition Optimization with Nonsmooth Regularization

Stochastic composition optimization has drawn much attention recently and has been successful in many emerging applications of machine learning, statistical analysis, and reinforcement learning. In this paper, we focus on the composition problem with a nonsmooth regularization penalty. Previous works either have slow convergence rates, or do not provide complete convergence analysis for the general pr...


Stochastic Variance-Reduced Cubic Regularized Newton Method

We propose a stochastic variance-reduced cubic regularized Newton method for non-convex optimization. At the core of our algorithm is a novel semi-stochastic gradient along with a semi-stochastic Hessian, which are specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an (ε, √ε)-approximate local minimum within Õ(n^{4/5}/ε^{3/2}) second-order oracl...


Proximal Stochastic Methods for Nonsmooth Nonconvex Finite-Sum Optimization

We analyze stochastic algorithms for optimizing nonconvex, nonsmooth finite-sum problems, where the nonsmooth part is convex. Surprisingly, unlike the smooth case, our knowledge of this fundamental problem is very limited. For example, it is not known whether the proximal stochastic gradient method with constant minibatch converges to a stationary point. To tackle this issue, we develop fast st...
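Proximal methods like those described above handle the nonsmooth convex part through a proximal operator rather than a gradient. As a minimal illustration (assuming an ℓ1 regularizer, the classic Lasso-style case mentioned in the main abstract), the proximal step reduces to coordinate-wise soft-thresholding:

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of lam * ||x||_1:
    argmin_x 0.5 * ||x - v||^2 + lam * ||x||_1,
    solved coordinate-wise by soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# A proximal (stochastic) gradient step then reads:
#   x_next = prox_l1(x - step * grad_estimate, step * lam)
print(prox_l1(np.array([3.0, -0.5, 1.2]), 1.0))  # -> [ 2. -0.  0.2]
```

Coordinates whose magnitude falls below the threshold are set exactly to zero, which is why proximal methods produce genuinely sparse iterates where plain subgradient steps do not.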


An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...



Journal:
  • CoRR

Volume: abs/1802.02339

Publication date: 2018